
    Effects of speeding up or slowing down animate or inanimate motions on timing

    It has recently been suggested that time perception and motor timing are influenced by the presence of biological movements and animacy in the visual scene. Here, we investigated the interactions among timing, speed and animacy in two experiments. In Experiment 1, observers had to press a button in synchrony with the landing of a falling ball while a dancer or a whirligig moved in the background of the scene. The speed of these two characters was artificially changed across sessions. We found striking differences in the timing of button-press responses as a function of condition. Responses were delayed considerably with increasing speed of the whirligig. By contrast, the effect of the dancer's speed was weaker and in the opposite direction. In Experiment 2, we assessed the perceived animacy of these characters and found that the dancer was rated as much more animate than the whirligig, irrespective of character speed. The results are consistent with the hypothesis that event timers are selectively biased as a function of perceived animacy, implicating high-level mechanisms of time modulation. However, response timing interacts with perceived animacy and speed in a complex manner.

    Control of reaching movements by muscle synergy combinations

    Controlling the movement of the arm to achieve a goal, such as reaching for an object, is challenging because it requires coordinating many muscles acting on many joints. The central nervous system (CNS) might simplify the control of reaching by directly mapping initial states and goals into muscle activations through the combination of muscle synergies, i.e., coordinated recruitments of groups of muscles with specific activation profiles. Here we review recent results from the analysis of reaching muscle patterns supporting such a control strategy. Muscle patterns for point-to-point movements can be reconstructed by the combination of a small number of time-varying muscle synergies, modulated in amplitude and timing according to movement direction and speed. Moreover, the modulation and superposition of the synergies identified from point-to-point movements capture the muscle patterns underlying multi-phasic movements, such as reaching through a via-point or to a target whose location changes after movement initiation. Thus, the sequencing of time-varying muscle synergies might implement an intermittent controller that would allow the construction of complex movements from simple building blocks.
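
    The time-varying synergy model sketched above can be written as m(t) = sum_i c_i * w_i(t - t_i): each synergy w_i is scaled by an amplitude coefficient c_i, shifted by an onset delay t_i, and the contributions are summed across synergies. The following Python sketch illustrates that reconstruction with toy data; the synergy waveforms, gains and function name are illustrative assumptions, not data or code from the studies reviewed.

        import numpy as np

        def reconstruct_pattern(synergies, amplitudes, delays, n_samples):
            """Sum amplitude-scaled, time-shifted synergies into one muscle pattern.

            synergies  : list of (n_muscles, duration) arrays (time-varying synergies)
            amplitudes : one scalar gain per synergy
            delays     : one integer onset sample per synergy
            n_samples  : length of the reconstructed pattern
            """
            n_muscles = synergies[0].shape[0]
            pattern = np.zeros((n_muscles, n_samples))
            for w, c, t0 in zip(synergies, amplitudes, delays):
                dur = min(w.shape[1], n_samples - t0)  # clip synergies that overrun the window
                pattern[:, t0:t0 + dur] += c * w[:, :dur]
            return pattern

        # Two toy synergies for 4 muscles, each a 50-sample Gaussian activation profile
        t = np.arange(50)
        bump = np.exp(-0.5 * ((t - 25) / 8.0) ** 2)
        w1 = np.outer([1.0, 0.6, 0.2, 0.0], bump)
        w2 = np.outer([0.0, 0.3, 0.8, 1.0], bump)

        # A "movement" recruits synergy 1 early and strongly, synergy 2 later and more weakly
        emg = reconstruct_pattern([w1, w2], amplitudes=[1.0, 0.4], delays=[10, 60], n_samples=150)
        print(emg.shape)  # (4, 150)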

    Mapping Muscles Activation to Force Perception during Unloading

    It has been widely shown that, when judging a force, humans rely mainly on the motor commands produced to interact with that force (i.e., the sense of effort). Despite a large body of previous investigations aimed at understanding the contributions of descending and ascending signals to force perception, very few attempts have been made to link a measure of neural output (i.e., EMG) to psychophysical performance. Indeed, the amount of correlation between EMG activity and perceptual decisions can be interpreted as an estimate of the contribution of central signals to the sensation of force. In this study we investigated this correlation by measuring the activity of eight arm muscles while participants performed a quasi-isometric force detection task. We present a method to quantitatively describe muscular activity (a "muscle-metric function") that is directly comparable to the description of the participants' psychophysical decisions about the stimulus force. We observed that, under our experimental conditions, muscle-metric absolute thresholds and the shape of the muscle-metric curves were closely related to those provided by the psychophysics. In fact, a global measure across the muscles considered was able to predict approximately 60% of the total variance of the perceptual decisions. Moreover, inter-subject differences in psychophysical sensitivity correlated highly with both the participants' muscle sensitivity and their joint torques. Overall, our findings provide insight into both the role played by corticospinal motor commands during a force detection task and the influence of the gravitational muscular torque on the estimation of vertical forces.
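
    A minimal sketch of the kind of comparison implied by the "muscle-metric function": fit a cumulative Gaussian to the proportion of "force detected" responses per stimulus level (psychometric function) and the same model to a normalized trial-by-trial EMG measure (muscle-metric function), then compare the fitted thresholds. The data values, function names and the use of SciPy are assumptions for illustration, not the authors' analysis pipeline.

        import numpy as np
        from scipy.stats import norm
        from scipy.optimize import curve_fit

        def cum_gauss(x, mu, sigma):
            # Cumulative Gaussian: probability of a "detected" outcome as a function of force
            return norm.cdf(x, loc=mu, scale=sigma)

        # Illustrative (made-up) data: stimulus force levels (N), proportion of trials detected,
        # and a normalized EMG-based detection measure at the same levels
        force    = np.array([0.1, 0.2, 0.4, 0.8, 1.2, 1.6])
        p_detect = np.array([0.05, 0.15, 0.40, 0.75, 0.92, 0.98])
        p_emg    = np.array([0.10, 0.20, 0.45, 0.70, 0.88, 0.95])

        (psy_mu, psy_sigma), _ = curve_fit(cum_gauss, force, p_detect, p0=[0.5, 0.3])
        (emg_mu, emg_sigma), _ = curve_fit(cum_gauss, force, p_emg, p0=[0.5, 0.3])

        # The two absolute thresholds (mu) and slopes (sigma) can then be compared per participant
        print(f"psychometric threshold ~ {psy_mu:.2f} N, muscle-metric threshold ~ {emg_mu:.2f} N")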

    The weight of time: gravitational force enhances discrimination of visual motion duration

    In contrast with the anisotropies in spatial and motion vision, anisotropies in the perception of motion duration have not, to our knowledge, been investigated. Here, we addressed this issue by asking observers to judge the duration of motion of a target accelerating over a fixed-length path in one of several directions. Observers watched either a pictorial or a quasi-blank scene, while being upright or tilted by 45° relative to the monitor and Earth's gravity. In a final condition, observers were upright and the scene was tilted by 45°. We found systematic anisotropies in the precision of the responses: performance was better for downward than for upward motion relative to the scene, both when the observer and the scene were upright and when either the observer or the scene was tilted by 45°, although tilting reduced the size of the effect. We argue that implicit knowledge about the force of gravity is incorporated in the neural mechanisms computing elapsed time. Furthermore, the results suggest that the effects of a virtual gravity can be represented with respect to a vertical direction concordant with the orientation of the visual scene and discordant with the direction of Earth's gravity.

    Development of human locomotion

    Neural control of locomotion in human adults involves the generation of a small set of basic patterned commands directed to the leg muscles. The commands are generated sequentially in time during each step by neural networks located in the spinal cord, called Central Pattern Generators. This review outlines recent advances in understanding how motor commands are expressed at different stages of human development. Similar commands are found in several other vertebrates, indicating that the development of locomotion follows common principles of organization of the control networks. Movements show a high degree of flexibility at all stages of development, which is instrumental for learning and for exploring variable interactions with the environment.

    Anticipating the effects of visual gravity during simulated self-motion: estimates of time-to-passage along vertical and horizontal paths

    By simulating self-motion on a virtual rollercoaster, we investigated whether acceleration cued by the optic flow affected the estimate of time-to-passage (TTP) to a target. In particular, we studied the role of a visual acceleration (1 g = 9.8 m/s²) simulating the effects of gravity in the scene, by manipulating motion law (accelerated or decelerated at 1 g, or constant speed) and motion orientation (vertical, horizontal). Thus, 1-g-accelerated motion in the downward direction or decelerated motion in the upward direction was congruent with the effects of visual gravity. We found that acceleration (positive or negative) is taken into account but is overestimated in magnitude in the calculation of TTP, independently of orientation. In addition, participants signaled TTP earlier when the rollercoaster accelerated downward at 1 g (as during free fall) than when the same acceleration occurred along the horizontal orientation. This time shift indicates an influence of the orientation relative to visual gravity on response timing, which could be attributed to the anticipation of the effects of visual gravity on self-motion along the vertical, but not the horizontal, orientation. Finally, precision in TTP estimates was higher during vertical fall than when traveling at constant speed along the vertical orientation, consistent with higher noise in TTP estimates when the motion violates gravity constraints.
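
    A hedged sketch of the TTP computation at stake: a first-order estimate divides the remaining distance by the current speed, whereas an acceleration-aware estimate solves d = v*t + a*t^2/2 for t. Overestimating the acceleration, as the results suggest, shortens the estimated TTP further. The numbers and the function name below are illustrative, not taken from the experiment.

        import math

        def ttp(distance, speed, accel=0.0):
            """Time to passage over `distance` at current `speed`, assuming acceleration `accel`."""
            if abs(accel) < 1e-12:
                return distance / speed                    # first-order estimate
            disc = speed**2 + 2.0 * accel * distance
            return (-speed + math.sqrt(disc)) / accel      # positive root of the quadratic

        # Illustrative case: 20 m to the target, current speed 8 m/s, true downward acceleration 1 g
        d, v, g = 20.0, 8.0, 9.81
        print(ttp(d, v))           # acceleration ignored            -> ~2.50 s
        print(ttp(d, v, g))        # acceleration taken into account -> ~1.36 s
        print(ttp(d, v, 1.5 * g))  # acceleration overestimated      -> ~1.19 s (earlier response)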

    Mental imagery of object motion in weightlessness

    Mental imagery represents a potential countermeasure for sensorimotor and cognitive dysfunctions due to spaceflight. It might help train people to deal with conditions unique to spaceflight. For instance, dynamic interactions with the inertial motion of weightless objects are only experienced in weightlessness, but they can be simulated on Earth using mental imagery. Such training might overcome the problem of calibrating fine-grained hand forces and of estimating the spatiotemporal parameters of the resulting object motion. Here, a group of astronauts grasped an imaginary ball, threw it against the ceiling or the front wall, and caught it after the bounce, during pre-flight, in-flight, and post-flight experiments. They varied the throwing speed across trials and imagined that the ball moved under Earth's gravity or in weightlessness. We found that the astronauts were able to reproduce qualitative differences between inertial and gravitational motion already on the ground, and that they further adapted their behavior during spaceflight. Thus, they adjusted the throwing speed and the catching time, equivalent to the duration of virtual ball motion, as a function of the imagined 0 g versus 1 g condition. Arm kinematics of the frontal throws further revealed a differential processing of the imagined gravity level in terms of the spatial features of the arm and virtual ball trajectories. We suggest that protocols of this kind may facilitate sensorimotor adaptation and help tune vestibular plasticity in-flight, since mental imagery of gravitational motion is known to engage the vestibular cortex.
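
    The contrast between imagined 1 g and 0 g ball motion can be made concrete with a small sketch: under weightlessness the virtual ball travels at constant velocity, while under gravity an upward throw decelerates, so the time from release to ceiling contact differs for the same throwing speed. The distances and speeds below are arbitrary examples, not the protocol's parameters.

        import math

        G = 9.81  # m/s^2

        def time_to_ceiling(distance, throw_speed, gravity=G):
            """Flight time of a ball thrown straight up over `distance` metres."""
            if gravity == 0.0:
                return distance / throw_speed              # 0 g: inertial, constant-velocity motion
            disc = throw_speed**2 - 2.0 * gravity * distance
            if disc < 0:
                raise ValueError("throw too slow to reach the ceiling under this gravity")
            return (throw_speed - math.sqrt(disc)) / gravity

        # 2 m from hand to ceiling, thrown at 7 m/s
        print(time_to_ceiling(2.0, 7.0, gravity=0.0))  # imagined 0 g: ~0.29 s
        print(time_to_ceiling(2.0, 7.0))               # imagined 1 g: ~0.40 s, arriving more slowly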

    Neural extrapolation of motion for a ball rolling down an inclined plane

    It is known that humans tend to misjudge the kinematics of a target rolling down an inclined plane. Because visuomotor responses are often more accurate and less prone to perceptual illusions than cognitive judgments, we asked how rolling motion is extrapolated for manual interception or drawing tasks. In three experiments, a ball rolled down an incline with kinematics that differed as a function of the starting position (four different positions) and slope (30°, 45° or 60°). In Experiment 1, participants had to punch the ball as it fell off the incline. In Experiment 2, the ball rolled down the incline but was stopped at the end; participants were asked to imagine that the ball kept moving and to punch it. In Experiment 3, the ball rolled down the incline and was stopped at the end; participants were asked to draw with the hand in the air the trajectory that the ball would have described had it kept moving. We found that performance was most accurate when the motion of the ball was visible until interception and haptic feedback of hand-ball contact was available (Experiment 1). However, even when participants punched an imaginary moving ball (Experiment 2) or drew the imaginary trajectory in the air (Experiment 3), they were able to extrapolate, to some extent, global aspects of the target motion, including its path, speed and arrival time. We argue that the path and kinematics of a ball rolling down an incline can be extrapolated surprisingly well by the brain, using both visual information and internal models of target motion.
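
    The incline kinematics behind the task can be sketched as follows: a solid ball rolling without slipping accelerates along the incline at a = (5/7) * g * sin(theta), so the starting position and slope jointly determine the speed and the time at which the ball leaves the incline. The code below only illustrates that relation with arbitrary distances; it is not the paper's stimulus-generation code.

        import math

        G = 9.81  # m/s^2

        def incline_kinematics(start_dist, slope_deg, rolling=True):
            """Exit speed (m/s) and travel time (s) for a ball released from rest on an incline.

            start_dist : distance along the incline from release point to its lower end (m)
            slope_deg  : slope of the incline in degrees (e.g. 30, 45, 60)
            rolling    : True  -> solid sphere rolling without slipping, a = (5/7) g sin(theta)
                         False -> frictionless sliding, a = g sin(theta)
            """
            a = G * math.sin(math.radians(slope_deg))
            if rolling:
                a *= 5.0 / 7.0  # rotational inertia of a solid sphere slows the translation
            exit_speed = math.sqrt(2.0 * a * start_dist)
            return exit_speed, exit_speed / a

        for slope in (30, 45, 60):
            v, t = incline_kinematics(start_dist=1.0, slope_deg=slope)
            print(f"slope {slope} deg: exit speed {v:.2f} m/s after {t:.2f} s")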

    Catching what we can't see: manual interception of occluded fly-ball trajectories

    Control of interceptive actions may involve a fine interplay between feedback-based and predictive mechanisms. These processes rely heavily on the target motion information available when the target is visible. However, short-term visual memory signals as well as implicit knowledge about the environment may also contribute to elaborating a predictive representation of the target trajectory, especially when visual feedback is partially unavailable because other objects occlude the visual target. To determine how different processes and information sources are integrated in the control of the interceptive action, we manipulated a computer-generated visual environment representing a baseball game. Twenty-four subjects intercepted fly-ball trajectories by moving a mouse cursor and indicated the interception with a button press. In two separate sessions, fly-ball trajectories were either fully visible or occluded for 750, 1000 or 1250 ms before ball landing. Natural ball motion was perturbed during the descending trajectory with the effects of either weightlessness (0 g) or increased gravity (2 g), at times such that, for occluded trajectories, 500 ms of perturbed motion were visible before ball disappearance. To examine the contribution of previous visual experience with the perturbed trajectories to the interception of invisible targets, the order of visible and occluded sessions was permuted among subjects. Under these experimental conditions, we showed that, with fully visible targets, subjects combined servo-control and predictive strategies. By contrast, when intercepting occluded targets, subjects relied mostly on predictive mechanisms based, however, on different types of information depending on previous visual experience. In fact, subjects without prior experience of the perturbed trajectories showed interceptive errors consistent with predictive estimates of the ball trajectory based on a priori knowledge of gravity. Conversely, the interceptive responses of subjects previously exposed to fully visible trajectories indicated that implicit knowledge of the perturbed motion was also taken into account in extrapolating the occluded trajectories.
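
    A minimal sketch of the extrapolation problem described above: if the ball is occluded at a given height while moving downward at a given speed, an observer assuming Earth gravity predicts one remaining fall time, whereas the actual time under a 0 g or 2 g perturbation is longer or shorter, respectively. The heights, speeds and function name are illustrative assumptions, not the experimental parameters.

        import math

        G = 9.81  # m/s^2

        def remaining_fall_time(height, down_speed, gravity):
            """Time until landing for a ball at `height` (m) moving downward at `down_speed` (m/s)."""
            if gravity == 0.0:
                return height / down_speed                 # 0 g: no further acceleration
            disc = down_speed**2 + 2.0 * gravity * height
            return (-down_speed + math.sqrt(disc)) / gravity

        # Ball occluded 4 m above the ground while moving down at 6 m/s
        h, v = 4.0, 6.0
        predicted_1g = remaining_fall_time(h, v, G)        # extrapolation assuming Earth gravity
        actual_0g    = remaining_fall_time(h, v, 0.0)      # trajectory perturbed to weightlessness
        actual_2g    = remaining_fall_time(h, v, 2.0 * G)  # trajectory perturbed to doubled gravity

        # A 1 g prior makes the response too early for 0 g targets and too late for 2 g targets
        print(f"1 g prediction {predicted_1g:.2f} s, 0 g actual {actual_0g:.2f} s, 2 g actual {actual_2g:.2f} s")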